Neural network analytic continuation for Monte Carlo: Improvement by statistical errors

Authors

Abstract

This study explores the use of neural-network-based analytic continuation to extract spectra from Monte Carlo data. We apply this technique to both synthetic and Monte Carlo-generated data. The training sets for the neural networks are carefully synthesized without "data leakage". We find that the training set should match the input correlation functions in terms of statistical error properties, such as noise level, dependence on imaginary time, and time-displaced correlations. We have developed a systematic method to synthesize such training datasets. Our improved algorithm outperforms the widely used maximum entropy method in highly noisy situations. As an example, our method successfully extracted the dynamic structure factor of the spin-1/2 Heisenberg chain from quantum Monte Carlo simulations.
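The abstract's central point is that synthetic training correlators should carry noise matching the statistical error properties of the real QMC input: noise level, imaginary-time dependence, and time-displaced correlations. The sketch below illustrates that idea under stated assumptions, not the paper's actual procedure: spectra built from random Gaussian peaks, a finite-temperature bosonic kernel, and an exponential model for the imaginary-time noise covariance are all illustrative choices.

```python
import numpy as np

# Illustrative sketch: synthesize training pairs (noisy G(tau), clean G(tau))
# whose noise is correlated in imaginary time, mimicking QMC statistical errors.
# The kernel, spectral model, and covariance model are assumptions for this demo.

beta, n_tau, n_omega = 10.0, 40, 200
tau = np.linspace(0.0, beta, n_tau)
omega = np.linspace(0.0, 8.0, n_omega)
domega = omega[1] - omega[0]

# Finite-temperature kernel (one common bosonic convention)
K = (np.exp(-tau[:, None] * omega[None, :])
     + np.exp(-(beta - tau[:, None]) * omega[None, :]))

def random_spectrum(rng):
    """Random spectrum: a few Gaussian peaks, normalized to unit weight."""
    A = np.zeros_like(omega)
    for _ in range(rng.integers(1, 4)):
        c = rng.uniform(0.5, 6.0)   # peak center
        w = rng.uniform(0.1, 1.0)   # peak width
        h = rng.uniform(0.2, 1.0)   # peak height
        A += h * np.exp(-0.5 * ((omega - c) / w) ** 2)
    return A / (A.sum() * domega)

def noisy_correlator(rng, sigma=1e-3, xi=2.0):
    """Clean G(tau) plus noise correlated across imaginary time.

    Covariance C_ij = sigma^2 * exp(-|tau_i - tau_j| / xi) models the
    time-displaced correlations of QMC errors (an illustrative choice)."""
    G_clean = K @ random_spectrum(rng) * domega
    C = sigma**2 * np.exp(-np.abs(tau[:, None] - tau[None, :]) / xi)
    noise = rng.multivariate_normal(np.zeros(n_tau), C)
    return G_clean + noise, G_clean

rng = np.random.default_rng(0)
G_noisy, G_clean = noisy_correlator(rng)
```

In practice the covariance model would be estimated from the Monte Carlo data itself rather than assumed, which is the matching the abstract emphasizes.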


Similar articles

Maximum Entropy Analytic Continuation of Quantum Monte Carlo Data

We present a pedagogical discussion of the Maximum Entropy Method, which is a precise and systematic way of analytically continuing Euclidean-time quantum Monte Carlo results to real frequencies. Here, Bayesian statistics is used to determine which of the infinitely many real-frequency spectra consistent with the QMC data is most probable. Bayesian inference is also used to qualify the s...

Full text

Analytic continuation of quantum Monte Carlo data by stochastic analytical inference.

We present an algorithm for the analytic continuation of imaginary-time quantum Monte Carlo data which is strictly based on principles of Bayesian statistical inference. Within this framework we are able to obtain an explicit expression for the calculation of a weighted average over possible energy spectra, which can be evaluated by standard Monte Carlo simulations, yielding as by-product also ...

Full text

Neural Network Gradient Hamiltonian Monte Carlo

Hamiltonian Monte Carlo is a widely used algorithm for sampling from posterior distributions of complex Bayesian models. It can efficiently explore high-dimensional parameter spaces guided by simulated Hamiltonian flows. However, the algorithm requires repeated gradient calculations, and these computations become increasingly burdensome as data sets scale. We present a method to substantially r...

Full text

Monte Carlo Errors with Less Errors

We explain in detail how to estimate mean values and assess statistical errors for arbitrary functions of elementary observables in Monte Carlo simulations. The method is to estimate and sum the relevant autocorrelation functions, which is argued to produce more certain error estimates than binning techniques and hence to help toward a better exploitation of expensive simulations. An effective ...

Full text
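The entry above describes estimating errors of correlated Monte Carlo data by summing autocorrelation functions rather than binning. A minimal sketch of that idea is given below; the simple truncate-at-first-negative windowing rule is an illustrative choice, not the automatic windowing procedure of the cited work, and the AR(1) test series is a hypothetical stand-in for real simulation output.

```python
import numpy as np

def autocorr_error(x, w_max=None):
    """Error of the mean of a correlated series via summed autocorrelations.

    Returns (error_of_mean, tau_int), where tau_int is the integrated
    autocorrelation time: tau_int = 1/2 + sum_t rho(t)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    d = x - x.mean()
    var = d @ d / n
    if w_max is None:
        w_max = n // 10
    tau_int = 0.5
    for t in range(1, w_max):
        rho = (d[:-t] @ d[t:]) / ((n - t) * var)
        if rho < 0:          # crude window: stop once noise dominates
            break
        tau_int += rho
    # Effectively n / (2 * tau_int) independent samples
    return np.sqrt(2.0 * tau_int * var / n), tau_int

# Example on an AR(1) series x[i] = 0.9 x[i-1] + noise,
# whose exact integrated autocorrelation time is 9.5.
rng = np.random.default_rng(1)
x = np.empty(20000)
x[0] = 0.0
for i in range(1, len(x)):
    x[i] = 0.9 * x[i - 1] + rng.normal()
err, tau_int = autocorr_error(x)
```

Naive binning with too-small bins would underestimate `err` here; summing autocorrelations makes the correlation length explicit.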

A continuation multilevel Monte Carlo algorithm

We propose a novel Continuation Multi Level Monte Carlo (CMLMC) algorithm for weak approximation of stochastic models that are described in terms of differential equations either driven by random measures or with random coefficients. The CMLMC algorithm solves the given approximation problem for a sequence of decreasing tolerances, ending with the desired one. CMLMC assumes discretization hiera...

Full text


Journal

Journal title: Chinese Physics B

Year: 2023

ISSN: 2058-3834, 1674-1056

DOI: https://doi.org/10.1088/1674-1056/accd4c